Escort Density Operators and Generalized Quantum Information Measures
Abstract
Parametrized families of density operators are studied. A generalization of the lower bound of Cramér and Rao is formulated. It involves escort density operators. The notion of a φ-exponential family is introduced. This family, together with its escort, optimizes the generalized lower bound. It also satisfies a maximum entropy principle and exhibits a thermodynamic structure in which entropy and ...
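For orientation, and only as an assumption about the standard construction usually associated with these terms (not the paper's exact definitions), the escort of a density operator ρ with parameter q > 0 is commonly taken to be
σ_q = ρ^q / Tr ρ^q,
and a φ-exponential family is built from a deformed logarithm ln_φ(u) = ∫₁^u dv/φ(v), with inverse exp_φ, so that members of the family have the form exp_φ(Σ_k θ_k T_k − α(θ)); the ordinary exponential family is recovered for φ(v) = v.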
Similar resources
Schrödinger operators and information measures
The Rényi R_q[ρ] = (1/(1−q)) log ∫ ρ^q(x) dx and Shannon S[ρ] = −∫ ρ(x) log ρ(x) dx entropies are information-theoretic measures which have made it possible to formulate the position-momentum uncertainty principle in a much more adequate and stringent way than the (variance-based) Heisenberg-like relation. They are also the basis for their associated spreading lengths, which measure the spreading of a probability den...
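As a rough numerical illustration of these two measures (not code from the paper; the Gaussian test density, grid, and function names are choices made here), their discretized versions can be evaluated as follows. Note that the entropy of bin probabilities differs from the continuous S[ρ] by the logarithm of the bin width.

    import numpy as np

    def shannon_entropy(p):
        # S[p] = -sum_i p_i log p_i over the nonzero bins
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def renyi_entropy(p, q):
        # R_q[p] = log(sum_i p_i^q) / (1 - q); approaches Shannon as q -> 1
        if np.isclose(q, 1.0):
            return shannon_entropy(p)
        return np.log(np.sum(p[p > 0] ** q)) / (1.0 - q)

    # Discretized Gaussian density used purely as a test case
    x = np.linspace(-5.0, 5.0, 1001)
    rho = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)
    p = rho / rho.sum()              # bin probabilities, summing to 1

    print(shannon_entropy(p), renyi_entropy(p, 2.0))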
Generalized information-entropy measures and Fisher information
We show how Fisher's information, already known for its particular character as the fundamental information-geometric object that plays the role of a metric tensor for a statistical differential manifold, can be derived in a relatively easy manner through the direct application of a generalized logarithm and exponential formalism to generalized information-entropy measures. We shall first shortly descr...
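For context, the standard (non-generalized) definitions assumed here, not quoted from the paper, are: the Fisher information of a parametrized density ρ(x; θ) is
I(θ) = ∫ ρ(x; θ) (∂_θ ln ρ(x; θ))² dx,
and the generalized-logarithm route typically replaces ln by a deformed logarithm such as the q-logarithm ln_q(u) = (u^(1−q) − 1)/(1 − q), which reduces to ln u as q → 1; the specific deformation used in the paper may differ.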
Generalized Measures of Information Transfer
Transfer entropy provides a general tool for analyzing the magnitudes and directions—but not the kinds—of information transfer in a system. We extend transfer entropy in two complementary ways. First, we distinguish state-dependent from state-independent transfer, based on whether a source’s influence depends on the state of the target. Second, for multiple sources, we distinguish between uniqu...
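For reference, the underlying transfer entropy (Schreiber's definition, which such extensions build on) from a source X to a target Y is
T_{X→Y} = Σ p(y_{n+1}, yₙ, xₙ) log [ p(y_{n+1} | yₙ, xₙ) / p(y_{n+1} | yₙ) ],
with the sum over joint histories (the past states yₙ, xₙ may be blocks of several time steps); it vanishes exactly when the source adds no predictive information about the target's next state beyond the target's own past.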
Generalized information and entropy measures in physics
The formalism of statistical mechanics can be generalized by starting from more general measures of information than the Shannon entropy and maximizing those subject to suitable constraints. We discuss some of the most important examples of information measures that are useful for the description of complex systems. Examples treated are the Rényi entropy, Tsallis entropy, Abe entropy, Kaniadaki...
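As one concrete instance (the standard textbook form, assumed here rather than quoted from the paper), the Tsallis entropy of a discrete distribution {pᵢ} is
S_q = (1 − Σᵢ pᵢ^q) / (q − 1),
which recovers the Shannon entropy −Σᵢ pᵢ log pᵢ in the limit q → 1; maximizing it under suitable constraints leads to q-exponential rather than Boltzmann-Gibbs distributions.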
Journal
Journal title: Open Systems & Information Dynamics
Year: 2005
ISSN: 1230-1612, 1793-7191
DOI: 10.1007/s11080-005-0483-5